MMEFX Fake News

Multi Media Enforcement Information For Fake News

Fake news is disinformation that is disseminated for political purposes, economic gain, or entertainment.

Bryan Cranston, SAG-AFTRA, and the Michael Jackson Deepfake

After unauthorized AI deepfakes of Bryan Cranston appeared on OpenAI's Sora 2, he alerted his union, the Screen Actors Guild–American Federation of Television and Radio Artists (SAG-AFTRA). The incident, which included a video pairing Cranston's likeness with a deepfaked Michael Jackson, circulated on Reddit and led to a collaboration among SAG-AFTRA, Cranston, and OpenAI to strengthen AI protections for performers.


Bryan Cranston and SAG-AFTRA address deepfakes

  • When OpenAI launched the upgraded Sora 2 video generator in September 2025, a flood of realistic, unauthorized deepfakes featuring celebrities began appearing online.
  • The incident involving Cranston and a "synthetic Michael Jackson" was a significant catalyst for action. The AI-generated video showed a likeness of the late singer taking a selfie with an image of Cranston's Breaking Bad character.
  • Although OpenAI's official policy for Sora 2 required explicit "opt-in" consent for using a living public figure's likeness, these unauthorized videos showed that the guardrails were failing. 

Cranston and SAG-AFTRA's response

  • After discovering the misuse of his voice and image, Cranston immediately brought the issue to his union.
  • SAG-AFTRA and major talent agencies collectively pressured OpenAI to improve its protocols.
  • Cranston had previously spoken out against the misuse of AI, giving a fiery speech on the topic at a SAG-AFTRA rally in 2023. 

The outcome

  • On October 20, 2025, Cranston, SAG-AFTRA, and OpenAI released a joint statement confirming that the AI company had "strengthened guardrails" around its opt-in policy.
  • In the statement, Cranston thanked OpenAI for its opt-in policy and for strengthening its guardrails, and said he hoped that all companies involved in this work would respect performers' personal and professional rights to manage replication of their voice and likeness.
  • SAG-AFTRA noted that performers still need better legal protections against AI "replication technology," even though Cranston's case had a positive outcome. The union referenced the NO FAKES Act, which the U.S. Congress is considering.
  • OpenAI also had to address deepfakes of deceased figures, including Martin Luther King Jr. and Michael Jackson. OpenAI's policy does not protect deceased celebrities from AI use, which has been a point of contention for estates and copyright holders. However, OpenAI did block some disrespectful depictions of King after complaints from his estate. 

Broader context for AI and SAG-AFTRA

  • California legislation: The push for AI protection in entertainment has extended beyond the unions. In September 2024, California passed new laws sponsored by SAG-AFTRA that require consent for the use of digital replicas of performers, living or deceased.
  • NO FAKES Act: SAG-AFTRA also advocates for the federal NO FAKES Act, which would ban the non-consensual use of digital replicas and has gained support from other industry groups.
  • SAG-AFTRA policies: The union's own policies require informed consent and fair compensation for the creation and use of digital replicas, and a new agreement with the company Narrativ addresses AI use in audio commercials. 

Oct 21, 2025